Most existing work learns sentiment-specific word representations to improve Twitter sentiment classification, encoding both n-gram and distant-supervised tweet sentiment information in the learning process. These methods assume that all words within a tweet share the sentiment polarity of the whole tweet, ignoring each word's own sentiment polarity. To address this problem, we propose to learn sentiment-specific word embeddings by exploiting both lexicon resources and distant-supervised information. We develop a multi-level sentiment-enriched word embedding learning method that uses a parallel asymmetric neural network to model n-grams, word-level sentiment, and tweet-level sentiment during learning. Experiments on standard benchmarks show that our approach outperforms state-of-the-art methods.
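The multi-level objective described above can be sketched as a weighted combination of three ranking losses, one per supervision signal. This is only an illustrative sketch, not the paper's implementation: the `hinge` formulation, the component names, and the weights `alpha` and `beta` are all assumptions.

```python
# Illustrative sketch (assumed, not the paper's code): combine an n-gram
# plausibility loss with word-level (lexicon) and tweet-level
# (distant-supervised) sentiment losses into one training objective.

def hinge(score_true, score_corrupt, margin=1.0):
    """Ranking hinge loss: the true item should outscore a corrupted one."""
    return max(0.0, margin - score_true + score_corrupt)

def multi_level_loss(ngram, word_sent, tweet_sent, alpha=0.4, beta=0.3):
    """Weighted sum of the three components (weights are hypothetical)."""
    l_ngram = hinge(*ngram)      # syntactic n-gram plausibility
    l_word  = hinge(*word_sent)  # lexicon-based word polarity
    l_tweet = hinge(*tweet_sent) # distant-supervised tweet polarity
    return alpha * l_ngram + beta * l_word + (1 - alpha - beta) * l_tweet

# Example scores from the (hypothetical) parallel sub-networks:
loss = multi_level_loss((0.9, 0.2), (0.7, 0.5), (0.8, 0.1))
print(round(loss, 2))  # → 0.45
```

In practice each score pair would come from one branch of the parallel asymmetric network, and the gradient of the combined loss would update the shared word embeddings.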